Extracting stochastic machines from recurrent neural networks trained on complex symbolic sequences

Authors

  • Peter Tiño
  • V. Vojtek
Abstract

We train a recurrent neural network on a single, long, complex symbolic sequence with positive entropy. The training process is monitored through information-theoretic performance measures. We show that although the sequence is unpredictable, the network is able to encode its topological and statistical structure in the activation scenarios of the recurrent neurons. Such scenarios can be compactly represented by stochastic machines extracted from the trained network. The generative models, i.e. the trained recurrent networks and the extracted stochastic machines, are compared using entropy spectra of the sequences they generate. In addition, entropy spectra computed directly from the machines capture the generalization abilities of the extracted machines and are related to the machines' long-term behavior.
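The extraction step sketched in the abstract can be made concrete. Below is a minimal, illustrative Python sketch, not the authors' exact procedure: it assumes per-symbol hidden-state activation vectors from an already-trained RNN are available, quantizes them into a small number of discrete states with a plain k-means step (one common choice in the extraction literature), and estimates symbol-labelled transition probabilities between those states. The names extract_stochastic_machine and block_entropy_rate, the k-means quantization, and all parameter values are assumptions introduced here for illustration.

    import numpy as np
    from collections import Counter

    def extract_stochastic_machine(activations, symbols, n_states=8, n_iter=50, seed=0):
        # Quantize hidden-state activation vectors (one per input symbol)
        # into n_states discrete machine states via Lloyd's-algorithm
        # k-means, then count symbol-labelled transitions between
        # consecutive states.  Illustrative sketch only.
        rng = np.random.default_rng(seed)
        X = np.asarray(activations, dtype=float)
        centers = X[rng.choice(len(X), size=n_states, replace=False)]
        for _ in range(n_iter):
            dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
            labels = dists.argmin(axis=1)
            for k in range(n_states):
                if np.any(labels == k):
                    centers[k] = X[labels == k].mean(axis=0)
        alphabet = sorted(set(symbols))
        sym_idx = {s: i for i, s in enumerate(alphabet)}
        # counts[i, a, j]: transitions from state i to state j that emit symbol a
        counts = np.zeros((n_states, len(alphabet), n_states))
        for t in range(len(symbols) - 1):
            counts[labels[t], sym_idx[symbols[t + 1]], labels[t + 1]] += 1
        totals = counts.sum(axis=(1, 2), keepdims=True)
        probs = np.divide(counts, totals, out=np.zeros_like(counts), where=totals > 0)
        return probs, alphabet

    def block_entropy_rate(seq, n):
        # Empirical order-n block entropy in bits per symbol.
        blocks = Counter(tuple(seq[i:i + n]) for i in range(len(seq) - n + 1))
        p = np.array(list(blocks.values()), dtype=float)
        p /= p.sum()
        return float(-(p * np.log2(p)).sum() / n)

Comparing block_entropy_rate curves over increasing block lengths n, computed on the training sequence and on sequences sampled from the trained network and from the extracted machine, gives a rough empirical analogue of the entropy-spectrum comparison the abstract refers to.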


Similar resources

Extracting finite-state representations from recurrent neural networks trained on chaotic symbolic sequences

While much work has been done in neural-based modeling of real-valued chaotic time series, little effort has been devoted to addressing similar problems in the symbolic domain. We investigate the knowledge induction process associated with training recurrent neural networks (RNN's) on single long chaotic symbolic sequences. Even though training RNN's to predict the next symbol leaves the standard ...


Title of dissertation: EXTRACTING SYMBOLIC REPRESENTATIONS LEARNED BY NEURAL NETWORKS

Thuan Q. Huynh, Doctor of Philosophy, 2012. Dissertation directed by: Professor James A. Reggia, Department of Computer Science. Understanding what neural networks learn from training data is of great interest in data mining, data analysis, and critical applications, and in evaluating neural network models. Unfor...


Online Symbolic-Sequence Prediction with Discrete-Time Recurrent Neural Networks

This paper studies the use of discrete-time recurrent neural networks for predicting the next symbol in a sequence. The focus is on online prediction, a task much harder than the classical offline grammatical inference with neural networks. The results obtained show that the performance of recurrent networks working online is acceptable when sequences come from finite-state machines or even fro...


Robust stability of stochastic fuzzy impulsive recurrent neural networks with time-varying delays

In this paper, global robust stability of stochastic impulsive recurrent neural networks with time-varying delays, represented by Takagi-Sugeno (T-S) fuzzy models, is considered. A novel Linear Matrix Inequality (LMI)-based stability criterion is obtained by using Lyapunov functional theory to guarantee the asymptotic stability of uncertain fuzzy stochastic impulsive recurrent neural...


Inferring stochastic regular grammars with recurrent neural networks

Recent work has shown that the extraction of symbolic rules improves the generalization performance of recurrent neural networks trained with complete (positive and negative) samples of regular languages. This paper explores the possibility of inferring the rules of the language when the network is trained instead with stochastic, positive-only data. For this purpose, a recurrent network with t...




Publication year: 1997